EDML for Learning Parameters in Directed and Undirected Graphical Models

Authors

  • Khaled S. Refaat
  • Arthur Choi
  • Adnan Darwiche
Abstract

EDML is a recently proposed algorithm for learning parameters in Bayesian networks. It was originally derived in terms of approximate inference on a meta-network which underlies the Bayesian approach to parameter estimation. While this initial derivation helped discover EDML in the first place and provided a concrete context for identifying some of its properties (e.g., in contrast to EM), the formal setting was somewhat tedious in the number of concepts it drew on. In this paper, we propose a greatly simplified perspective on EDML which casts it as a general approach to continuous optimization. The new perspective has several advantages. First, it makes immediate some results that were non-trivial to prove initially. Second, it facilitates the design of EDML algorithms for new graphical models, leading to a new algorithm for learning parameters in Markov networks. We derive this algorithm in this paper, and provide an empirical comparison with a commonly used gradient method, showing that EDML can find better estimates several times faster.
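As context for the gradient baseline mentioned in the abstract, the sketch below illustrates maximum-likelihood parameter learning in a tiny pairwise Markov network by plain gradient ascent: for a fully parameterized log-linear model, the gradient of the average log-likelihood with respect to each log-potential entry is the empirical expectation of its indicator feature minus the model expectation. This is a minimal illustration of the standard gradient method, not the EDML algorithm itself; all names and the toy data are hypothetical.

```python
import numpy as np

def model_expectations(theta):
    # theta[i, j] = log-potential for the joint assignment (X0=i, X1=j).
    unnorm = np.exp(theta)
    Z = unnorm.sum()            # exact partition function (only 4 states)
    return unnorm / Z           # joint distribution p(x0, x1)

def fit(data, steps=500, lr=0.5):
    # Empirical distribution over the 4 joint states.
    emp = np.zeros((2, 2))
    for x0, x1 in data:
        emp[x0, x1] += 1.0
    emp /= len(data)

    # Gradient ascent on the average log-likelihood:
    # d/d theta[i,j] = emp[i,j] - p_model[i,j].
    theta = np.zeros((2, 2))
    for _ in range(steps):
        theta += lr * (emp - model_expectations(theta))
    return theta

# Toy data with strong positive correlation between X0 and X1.
data = [(0, 0)] * 40 + [(1, 1)] * 40 + [(0, 1)] * 10 + [(1, 0)] * 10
theta = fit(data)
p = model_expectations(theta)
```

Because the model is fully parameterized, the fixed point of this update matches the empirical distribution exactly; the interest of methods like EDML lies in reaching good estimates in fewer, cheaper iterations when exact expectations are replaced by approximate inference.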

Similar Articles

A conditional independence algorithm for learning undirected graphical models

When it comes to learning graphical models from data, approaches based on conditional independence tests are among the most popular methods. Since Bayesian networks dominate research in this field, these methods usually refer to directed graphs, and thus have to determine not only the set of edges, but also their direction. At least for a certain kind of possibilistic graphical models, however,...

Neural Variational Inference and Learning in Undirected Graphical Models

Many problems in machine learning are naturally expressed in the language of undirected graphical models. Here, we propose black-box learning and inference algorithms for undirected models that optimize a variational approximation to the log-likelihood of the model. Central to our approach is an upper bound on the log-partition function parametrized by a function q that we express as a flexible ...

Graphical Model Structure Learning with ℓ1-Regularization

This work looks at fitting probabilistic graphical models to data when the structure is not known. The main tool to do this is ℓ1-regularization and the more general group ℓ1-regularization. We describe limited-memory quasi-Newton methods to solve optimization problems with these types of regularizers, and we examine learning directed acyclic graphical models with ℓ1-regularization, learning un...

State estimation in discrete graphical models

p(X1:D | G, θ) (1) where G is the graph structure (either directed, undirected, or both), and θ are the parameters. In Bayesian modeling, we treat the parameters as random variables as well, but they are in turn conditioned on fixed hyperparameters α: p(X1:D, θ | G, α) (2) Clearly this can be represented as in Equation 1 by appropriately redefining X and θ. It will also be notationally helpful to d...

A Brief Introduction to Graphical Models and How to Learn Them from Data

Graphical Models: Core Ideas and Notions. A Simple Example: How does it work in principle? Conditional Independence Graphs: conditional independence and the graphoid axioms; separation in (directed and undirected) graphs; decomposition/factorization of distributions. Evidence Propagation in Graphical Models. Building Graphical Models. Learning Graphical Models from Data: quantitative (parameter) and qu...



Publication date: 2013